Class-Wise Fully Convolutional Network for Semantic Segmentation of Remote Sensing Images
Authors
Abstract
Semantic segmentation is a fundamental task in remote sensing image interpretation, which aims to assign a semantic label to every pixel of a given image. Accurate segmentation remains challenging due to the complex distributions of various ground objects. With the development of deep learning, a series of networks represented by the fully convolutional network (FCN) has made remarkable progress on this problem, but the accuracy is still far from expectations. This paper focuses on the importance of class-specific features for different land-cover objects and presents a novel end-to-end class-wise processing framework for segmentation. The proposed class-wise FCN (C-FCN) is shaped in the form of an encoder-decoder structure with skip connections, in which the encoder is shared to produce general features for all categories and the decoder is designed to process class-wise features. Specifically, class-wise transition (CT), class-wise up-sampling (CU), class-wise supervision (CS), and class-wise classification (CC) modules are designed to achieve the class-wise feature transfer, recover the resolution of class-wise feature maps, bridge the encoder and the modified decoder, and implement class-wise classifications, respectively. Class-wise group convolutions are adopted throughout the architecture to keep the number of parameters under control. The method is tested on the public ISPRS 2D semantic labeling benchmark datasets. Experimental results show that the proposed C-FCN significantly improves segmentation performance compared with many state-of-the-art FCN-based networks, revealing its potential for accurate segmentation of remote sensing images.
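To make the class-wise idea concrete, the following is a minimal PyTorch-style sketch of a per-class decoding block built from group convolutions, so that each class owns its own slice of channels and its own filters. The module and parameter names (ClassWiseUpsample, feats_per_class, the 6-class/16-channel sizes) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the class-wise decoding idea, assuming a PyTorch setting.
# Names and channel sizes are illustrative, not the paper's exact modules.
import torch
import torch.nn as nn

class ClassWiseUpsample(nn.Module):
    """Per-class decoding branch realized with group convolutions."""
    def __init__(self, num_classes: int, feats_per_class: int = 16):
        super().__init__()
        channels = num_classes * feats_per_class
        self.up = nn.Sequential(
            # groups=num_classes keeps the per-class branches separate
            # and keeps the parameter count low.
            nn.Conv2d(channels, channels, kernel_size=3, padding=1,
                      groups=num_classes),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
        )
        # Class-wise classification: one score map per class, again grouped.
        self.classify = nn.Conv2d(channels, num_classes, kernel_size=1,
                                  groups=num_classes)

    def forward(self, x):
        x = self.up(x)
        return x, self.classify(x)  # features for the next stage + logits

# Usage: 6 classes, with shared-encoder features already split into
# 6 * 16 = 96 class-specific channels by a (hypothetical) transition module.
block = ClassWiseUpsample(num_classes=6)
feats, logits = block(torch.randn(1, 96, 32, 32))
print(feats.shape, logits.shape)  # (1, 96, 64, 64), (1, 6, 64, 64)
```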
منابع مشابه
Maritime Semantic Labeling of Optical Remote Sensing Images with Multi-Scale Fully Convolutional Network
In current remote sensing literature, the problems of sea-land segmentation and ship detection (including in-dock ships) are investigated separately despite the high correlation between them. This inhibits joint optimization and makes the implementation of the methods highly complicated. In this paper, we propose a novel fully convolutional network to accomplish the two tasks simultaneously, in...
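As a rough illustration of handling both tasks in one network, the sketch below uses a shared encoder with one head per task; the names and layer sizes are assumptions and do not reproduce the paper's multi-scale architecture.

```python
# Minimal multi-task sketch: one shared encoder, one head per task.
# This is an assumption for illustration, not the paper's actual network.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
sea_land_head = nn.Conv2d(64, 2, kernel_size=1)  # sea vs. land logits
ship_head = nn.Conv2d(64, 1, kernel_size=1)      # per-pixel ship score

feats = encoder(torch.randn(1, 3, 128, 128))
sea_land, ships = sea_land_head(feats), ship_head(feats)
print(sea_land.shape, ships.shape)  # (1, 2, 128, 128), (1, 1, 128, 128)
```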
Effective Fusion of Multi-Modal Remote Sensing Data in a Fully Convolutional Network for Semantic Labeling
In recent years, Fully Convolutional Networks (FCN) have led to a great improvement of semantic labeling for various applications including multi-modal remote sensing data. Although different fusion strategies have been reported for multi-modal data, there is no in-depth study of the reasons of performance limits. For example, it is unclear, why an early fusion of multi-modal data in FCN does n...
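For reference, here is a minimal sketch of the two fusion strategies mentioned above: early fusion by stacking the modalities before the first convolution versus late fusion of separately encoded modalities. Module names and channel counts are illustrative assumptions.

```python
# Early vs. late fusion of two modalities (e.g., a 3-band image and a 1-band
# elevation map); sizes and names are illustrative assumptions only.
import torch
import torch.nn as nn

def tiny_encoder(in_ch):
    return nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU())

# Early fusion: stack the modalities channel-wise before encoding.
early = tiny_encoder(in_ch=3 + 1)

# Late fusion: encode each modality separately, then merge feature maps.
enc_img, enc_dsm = tiny_encoder(3), tiny_encoder(1)
merge = nn.Conv2d(64, 32, kernel_size=1)

img, dsm = torch.randn(1, 3, 64, 64), torch.randn(1, 1, 64, 64)
f_early = early(torch.cat([img, dsm], dim=1))
f_late = merge(torch.cat([enc_img(img), enc_dsm(dsm)], dim=1))
print(f_early.shape, f_late.shape)  # both (1, 32, 64, 64)
```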
Training Bit Fully Convolutional Network for Fast Semantic Segmentation
Fully convolutional neural networks give accurate, per-pixel prediction for input images and have applications like semantic segmentation. However, a typical FCN usually requires lots of floating point computation and large run-time memory, which effectively limits its usability. We propose a method to train Bit Fully Convolution Network (BFCN), a fully convolutional neural network that has low...
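The toy snippet below shows one common way to map floating-point weights onto a few discrete levels; it is only a generic uniform-quantization sketch, not the BFCN training procedure described in that paper.

```python
# Generic symmetric uniform quantization of weights to low bit-width levels.
# This is a sketch of the general idea, not the BFCN method itself.
import torch

def quantize_weights(w: torch.Tensor, bits: int = 2) -> torch.Tensor:
    """Quantize weights to integers in [-qmax, qmax], then rescale."""
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    return torch.clamp(torch.round(w / scale), -qmax, qmax) * scale

w = torch.randn(64, 64, 3, 3)
wq = quantize_weights(w, bits=2)
print(wq.unique().numel())  # at most 2 * qmax + 1 distinct values (3 for bits=2)
```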
Fully Point-wise Convolutional Neural Network for Modeling Statistical Regularities in Natural Images
Modeling statistical regularities is the problem of representing the pixel distributions in natural images, and usually applied to solve the ill-posed image processing problems. In this paper, we present an extremely efficient CNN architecture for modeling statistical regularities. Our method is based on the observation that, by random sampling the pixels in natural images, we can obtain a set ...
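The observation about random pixel sampling can be illustrated with a network built entirely from 1x1 convolutions: since no layer has spatial extent, the per-pixel outputs do not change when the pixels are shuffled. The sketch below is an illustrative assumption, not the paper's model.

```python
# A fully point-wise network: every layer is a 1x1 convolution, so each pixel
# is processed independently. Names and sizes are illustrative assumptions.
import torch
import torch.nn as nn

pointwise_net = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=1), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=1), nn.ReLU(),
    nn.Conv2d(64, 3, kernel_size=1),
)

# Shuffling the pixels before or after the network gives identical results,
# which is why random pixel sampling preserves the modeled statistics.
x = torch.randn(1, 3, 32, 32)
perm = torch.randperm(32 * 32)
y_full = pointwise_net(x).flatten(2)[..., perm]
y_perm = pointwise_net(x.flatten(2)[..., perm].view(1, 3, 32, 32)).flatten(2)
print(torch.allclose(y_full, y_perm, atol=1e-6))  # True
```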
Improving Fully Convolution Network for Semantic Segmentation
Fully Convolution Networks (FCN) have achieved great success in dense prediction tasks including semantic segmentation. In this paper, we start from discussing FCN by understanding its architecture limitations in building a strong segmentation network. Next, we present our Improved Fully Convolution Network (IFCN). In contrast to FCN, IFCN introduces a context network that progressively expands...
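A context module of this kind is often built from convolutions with increasing dilation rates; the sketch below follows that general idea under stated assumptions and is not necessarily the exact IFCN design.

```python
# A context module that progressively enlarges the receptive field using
# increasing dilation rates; an illustrative sketch, not the IFCN itself.
import torch
import torch.nn as nn

def context_network(channels: int = 64, dilations=(1, 2, 4, 8)):
    layers = []
    for d in dilations:
        # padding == dilation keeps the spatial resolution fixed while each
        # stage looks at a progressively wider neighborhood.
        layers += [nn.Conv2d(channels, channels, kernel_size=3,
                             padding=d, dilation=d),
                   nn.ReLU(inplace=True)]
    return nn.Sequential(*layers)

ctx = context_network(channels=64)
print(ctx(torch.randn(1, 64, 32, 32)).shape)  # (1, 64, 32, 32)
```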
Journal
Journal title: Remote Sensing
Year: 2021
ISSN: 2315-4632, 2315-4675
DOI: https://doi.org/10.3390/rs13163211